58 research outputs found

    Network and systems medicine: Position paper of the European Collaboration on Science and Technology action on Open Multiscale Systems Medicine

    Introduction: Network and systems medicine has evolved rapidly over the past decade, thanks to computational and integrative tools that stem in part from systems biology. However, major challenges and hurdles remain regarding validation and translation into clinical application and decision making for precision medicine. Methods: In this context, the Collaboration on Science and Technology Action on Open Multiscale Systems Medicine (OpenMultiMed) reviewed the available advanced technologies for multidimensional data generation and integration in an open-science approach, key clinical applications of network and systems medicine, and the main issues and opportunities for the future. Results: The development of multi-omic approaches and new digital tools provides a unique opportunity to explore complex biological systems and networks at different scales. Moreover, applying the findable, accessible, interoperable, and reusable (FAIR) principles and adopting standards increases data availability and sharing for multiscale integration and interpretation. These innovations have led to the first clinical applications of network and systems medicine, particularly in personalized therapy and drug dosing. Broader application of network and systems medicine will now require greater engagement of patients and health care providers, as well as educating new generations of medical doctors and biomedical researchers to shift current organ- and symptom-based medical concepts toward network- and systems-based ones for more precise diagnoses, interventions, and, ideally, prevention. Conclusion: In this dynamic setting, the health care system will also have to evolve, if not be revolutionized, in its organization and management.

    Alternative Splicing at a NAGNAG Acceptor Site as a Novel Phenotype Modifier

    Approximately 30% of alleles causing genetic disorders generate premature termination codons (PTCs), which are usually associated with severe phenotypes. However, bypassing the deleterious stop codon can lead to a mild disease outcome. Splicing at NAGNAG tandem splice sites has been reported to result in the insertion or deletion (indel) of three nucleotides. We identified such a mechanism as the origin of the mild to asymptomatic phenotype observed in cystic fibrosis patients homozygous for the E831X mutation (2623G>T) in the CFTR gene. Analyses performed on nasal epithelial cell mRNA detected three distinct isoforms, a considerably more complex situation than expected for a single nucleotide substitution. Structure-function studies and in silico analyses provided the first experimental evidence of an indel of a stop codon by alternative splicing at a NAGNAG acceptor site. In addition to contributing to proteome plasticity, alternative splicing at a NAGNAG tandem site can thus remove a disease-causing UAG stop codon. This molecular study reveals a naturally occurring mechanism whose effect could otherwise be attributed to modifier genes or epigenetic factors. This finding is important for genetic counseling as well as for deciding on appropriate therapeutic strategies.
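    The tandem-acceptor mechanism described above can be sketched computationally. The following is a minimal Python illustration (the toy sequence and function name are our own, not taken from the study): scanning an acceptor region for a NAGNAG motif shows how choosing the proximal versus the distal AG shifts the exon boundary by exactly three nucleotides, i.e. one codon.

```python
import re

# Hypothetical illustration: scan an acceptor region for NAGNAG tandem
# acceptor motifs (N = any base, AG = the invariant acceptor
# dinucleotide). Splicing at the proximal vs. distal AG shifts the exon
# start by exactly 3 nt, inserting or deleting one codon in the mRNA.
def find_nagnag_acceptors(seq):
    """Return 0-based positions where a NAGNAG tandem acceptor starts."""
    return [m.start() for m in re.finditer(r"(?=[ACGT]AG[ACGT]AG)", seq)]

# Toy intron/exon boundary sequence (not the real CFTR context).
region = "TTTTCAGGAGATG"
for pos in find_nagnag_acceptors(region):
    proximal_exon = region[pos + 3:]   # splice after the first AG
    distal_exon = region[pos + 6:]     # splice after the second AG
    print(pos, proximal_exon, distal_exon)
```

    The two resulting exon starts differ by one codon, which is how splicing at the distal AG can drop an in-frame UAG stop codon from the transcript.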

    SR protein-mediated inhibition of CFTR exon 9 inclusion: molecular characterization of the intronic splicing silencer

    The intronic splicing silencer (ISS) of CFTR exon 9 promotes exclusion of this exon from the mature mRNA. This negative influence has important consequences with regard to human pathologic events, as lack of exon 9 correlates well with the occurrence of monosymptomatic and full forms of CF disease. We have previously shown that the ISS element interacts with members of the SR protein family. In this work, we identify SF2/ASF and SRp40 as the specific SR proteins binding to this element and map their precise binding sites in IVS9. We have also performed a functional analysis of the ISS element using a variety of unrelated SR-binding sequences and different splicing systems. Our results suggest that SR proteins mediate CFTR exon 9 exclusion by providing a ‘decoy’ sequence in the vicinity of its suboptimal donor site. The results of this study give insight into intron ‘exonization’ mechanisms and provide useful indications for the development of novel therapeutic strategies aimed at the recovery of exon inclusion.

    SEPTIN12 Genetic Variants Confer Susceptibility to Teratozoospermia

    It is estimated that 10–15% of couples are infertile, and male factors account for about half of these cases. With the advent of intracytoplasmic sperm injection (ICSI), many infertile men have been able to father offspring. However, teratozoospermia remains a major challenge to tackle. Septins belong to a family of cytoskeletal proteins with GTPase activity and are involved in various biological processes, e.g., morphogenesis, compartmentalization, apoptosis, and cytokinesis. SEPTIN12, identified by cDNA microarray analysis of infertile men, is exclusively expressed in post-meiotic male germ cells. Septin12+/+/Septin12+/− chimeric mice have multiple reproductive defects, including the presence of immature sperm in the semen, and sperm with bent neck (a defect of the annulus) and nuclear DNA damage. These facts make SEPTIN12 a candidate gene for male infertility in humans. In this study, we sequenced the entire coding region of SEPTIN12 in infertile men (n = 160) and fertile controls (n = 200) and identified ten variants. Among them is the c.474 G>A variant within exon 5, which encodes part of the GTP binding domain. The variant creates a novel splice donor site that causes skipping of a portion of exon 5, resulting in a truncated protein lacking the C-terminal half of SEPTIN12. Most individuals homozygous for the c.474 A allele had teratozoospermia (normal sperm morphology <14%), and their sperm showed bent tails and de-condensed nuclei with significant DNA damage. An ex vivo experiment showed that truncated SEPT12 inhibits filament formation in a dose-dependent manner. This study provides the first causal link between a SEPTIN12 genetic variant and male infertility with distinctive sperm pathology. Our finding also suggests vital roles for SEPT12 in sperm nuclear integrity and tail development.

    Consensus on the use and interpretation of cystic fibrosis mutation analysis in clinical practice

    It is often challenging for the clinician interested in cystic fibrosis (CF) to interpret molecular genetic results and to integrate them into the diagnostic process. The limitations of genotyping technology, the choice of mutations to be tested, and the clinical context in which the test is administered can all influence how genetic information is interpreted. This paper describes the conclusions of a consensus conference convened to address the use and interpretation of CF mutation analysis in clinical settings.

    Memory Optimizations for Distributed Stream-based Applications

    Distributed stream-based applications manage large quantities of data and exhibit unique production and consumption patterns that set them apart from general-purpose applications. This dissertation examines possible ways of creating more efficient memory management schemes, focusing on the memory reclamation problem. It takes advantage of special traits of streaming applications to extend the definition of the garbage collection problem for those applications to include not only data items that are not reachable but also items that have no effect on the final outcome of the application. Streaming applications typically fully process only a portion of the data, and resources directed toward the remaining data items (i.e., those that don't affect the final outcome) can be viewed as wasted resources that should be minimized. Two complementary approaches are suggested: (1) garbage identification and (2) adaptive resource utilization. Garbage identification is concerned with an analysis of dynamic data dependencies to infer which items the application is no longer going to access. Several garbage identification algorithms are examined. Each algorithm uses a set of application properties (possibly distinct from one another) to reduce the memory consumption of the application. The performance of these algorithms is compared to that of an ideal garbage collector, using a novel logging/post-mortem analyzer. The results indicate that the algorithms that achieve a low memory footprint (close to that of an ideal garbage collector) make their garbage identification decisions locally but base these decisions on best-effort global information obtained from other components of the distributed application. The Adaptive Resource Utilization (ARU) algorithm analyzes the dynamic relationships between the production and consumption of data items. It uses this information to infer the capacity of the system to process data items and adjusts data generation accordingly. The ARU algorithm makes local capacity decisions based on best-effort global information. This algorithm is found to be as effective as the most successful garbage identification algorithm in reducing the memory footprint of stream-based applications, thus confirming the observation that using best-effort global information to make local decisions is fundamental to reducing memory consumption for stream-based applications.
    Ph.D. Committee Chair: Ramachandran, Umakishore; Committee Members: Ahamad, Mustaque; Fujimoto, Richard M.; Knobe, Kathleen; Prvulovic, Milo
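    The local-decision/best-effort-global-information idea above can be illustrated with a small sketch (our own simplification, not the dissertation's code): consumers periodically report the oldest timestamp they may still need, and a channel locally reclaims every item older than the minimum of the reports it has seen, even if some reports are stale.

```python
# Sketch of local garbage identification driven by best-effort global
# information. Class and method names are hypothetical.
class Channel:
    def __init__(self):
        self.items = {}           # timestamp -> data item
        self.consumer_marks = {}  # consumer id -> oldest timestamp still needed

    def put(self, ts, item):
        self.items[ts] = item

    def report(self, consumer, oldest_needed):
        # Best-effort: a stale report only delays collection, never
        # causes a live item to be reclaimed.
        self.consumer_marks[consumer] = oldest_needed

    def collect(self):
        """Locally reclaim items no consumer can still reach."""
        if not self.consumer_marks:
            return 0
        low_water = min(self.consumer_marks.values())
        dead = [ts for ts in self.items if ts < low_water]
        for ts in dead:
            del self.items[ts]
        return len(dead)

ch = Channel()
for ts in range(10):
    ch.put(ts, f"frame-{ts}")
ch.report("tracker", 7)
ch.report("display", 5)
print(ch.collect())       # reclaims timestamps 0..4
print(sorted(ch.items))   # [5, 6, 7, 8, 9]
```

    The collection decision is made entirely locally at the channel; the only global input is the set of (possibly out-of-date) consumer reports, mirroring the dissertation's observation about low-footprint algorithms.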

    Dead Timestamp Identification in Stampede

    Stampede is a parallel programming system that supports computationally demanding applications, including interactive vision, speech, and multimedia collaboration. The system alleviates concerns such as communication, synchronization, and buffer management when programming such real-time stream-oriented applications. Threads are loosely connected by channels, which hold streams of items, each identified by a timestamp. There are two performance concerns when programming with Stampede. The first is space: ensuring that memory is not wasted on items bearing a timestamp that is not fully processed. The second is time: ensuring that processing resources are not wasted on a timestamp that is not fully processed. In this paper we introduce a single unifying framework, dead timestamp identification, that addresses both the space and time concerns simultaneously. Dead timestamps on a channel represent garbage; dead timestamps at a thread represent computations that need not be performed. This framework has been implemented in the Stampede system, and experimental results showing its space advantage are presented. Using a color-based people-tracker application, we show that the space advantage can be significant (up to 40%) compared to the previous GC techniques in Stampede.
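    The unifying idea can be shown in a rough sketch (our own simplification, not Stampede's API): a timestamp no downstream stage will still use is "dead" everywhere at once, so on a channel it is garbage to reclaim, and at a thread it is a computation to skip.

```python
# Hypothetical sketch of unified dead-timestamp identification.
def dead_timestamps(produced, still_needed):
    """Timestamps produced so far that no downstream stage still needs."""
    return {ts for ts in produced if ts < min(still_needed)}

produced = {1, 2, 3, 4, 5, 6}
still_needed = [4, 5]   # oldest timestamp each downstream stage may use
dead = dead_timestamps(produced, still_needed)

# Space: reclaim dead items from a channel.
channel = {ts: f"item-{ts}" for ts in produced}
for ts in dead:
    del channel[ts]

# Time: a thread skips work on a dead timestamp.
def process(ts):
    if ts in dead:
        return None          # computation elided
    return f"processed-{ts}"

print(sorted(channel))       # [4, 5, 6]
print(process(1), process(6))
```

    A single predicate thus drives both reclamation (space) and work elision (time), which is the paper's point about handling both concerns in one framework.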

    Rate Control for Threads in Streaming Applications

    A large emerging class of interactive multimedia streaming applications can be represented as a coarse-grain, pipelined dataflow graph. Such applications are ideal candidates for execution on a high-performance cluster. Each node in the graph represents a task component of the streaming application, with each task consuming data from preceding stages and producing data for subsequent stages. The amount of data computation performed by a task depends on a multitude of design issues, such as task algorithm latency, data dependency, and resource availability. One common characteristic of these applications is the use of current data: a task obtains the latest data from preceding stages, skipping over older data if necessary to perform its computation. When parallelized, such applications waste processing and memory resources on data that is eventually dropped from the application pipeline. To overcome this problem, we have designed and implemented a distributed 'Rate Control' algorithm that dynamically adjusts the processing rate of each thread to meet application requirements. Optimizations incorporating application-level knowledge, such as data dependencies between producers and consumers, further improve performance. A color-based people-tracker application is used to explore the performance benefits of the proposed Rate Control algorithm. We show that Rate Control reduces the application's memory footprint by 80% compared to our previously published results. Optimizations further increase the application's throughput by 226% and reduce latency by 40.5%.
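    The rate-adjustment idea can be sketched as a simple feedback loop (a toy construction of our own, not the paper's algorithm): a producer observes what fraction of its items the consumer actually processed and moves its production rate toward the observed consumption rate, so it stops doing work on items destined to be dropped.

```python
# Hypothetical feedback-based rate control; `gain` damps oscillation.
def adjust_rate(rate, produced, consumed, gain=0.5):
    """Move the production rate toward the observed consumption rate."""
    if produced == 0:
        return rate
    utilization = consumed / produced   # 1.0 means nothing was dropped
    return rate * (1 - gain) + rate * utilization * gain

rate = 30.0   # items/second; converges toward what the consumer keeps up with
for produced, consumed in [(30, 15), (22, 15), (18, 15)]:
    rate = adjust_rate(rate, produced, consumed)
    print(round(rate, 1))
```

    Each iteration shrinks the gap between what is produced and what is consumed; a distributed version would feed best-effort downstream statistics back to each producing thread.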